Patent abstract:
The invention relates to a method and a system (100) for preventing anti-forensic actions. The method identifies a suspicious object from a plurality of objects on a computer device (102) and monitors actions performed by the suspicious object. The method intercepts a first command from the suspicious object, aimed at creating and/or modifying a digital artifact on the computer device, and, after the interception of the first command, intercepts a second command from the suspicious object, aimed at eliminating at least one of the suspicious object and the digital artifact. In response to the interception of both the first command aimed at creating and/or modifying the digital artifact and the second command aimed at eliminating at least one of the suspicious object and the digital artifact, the method blocks the second command and saves the suspicious object and the digital artifact in a digital archive (116). The system is suitable for performing the method described above. [Fig. 1]
Publication number: CH716699A2
Application number: CH01070/20
Filing date: 2020-08-31
Publication date: 2021-04-15
Inventors: Strogov Vladimir; Ishanov Oleg; Dod Alexey; Beloussov Serguei; Protasov Stanislav
Applicant: Acronis Int Gmbh
IPC main classification:
Patent description:

FIELD OF TECHNOLOGY
[0001] The present disclosure relates to the field of data security and, more specifically, to systems and methods for preventing actions by suspicious objects that are capable of nullifying digital forensics investigations.
BACKGROUND
[0002] As digital information technology is increasingly relied upon today, the number of computer crimes such as hacking, data theft and malware attacks has increased in parallel. As a result, cybersecurity methods have shifted their focus to tracking malicious software for analysis after an attack has occurred. Some advanced malicious software, however, not only causes damage, steals data, encrypts files, etc., but also removes all traces of its presence from a computer system. Once the malicious act has been committed, the malicious software can self-destruct and prevent its activities from being tracked, making subsequent attempts at digital investigation extremely difficult if not impossible.
[0003] Conventional countermeasures, such as anti-rootkit programs, are not specialized enough to account for the advanced nature of such malicious software. Anti-rootkit programs can detect specific malicious software, but they are not designed to store information about rootkit activity and related artifacts. What is needed, therefore, is a specialized solution that can prevent the destruction, deletion and concealment of traces of malicious software activity.
SUMMARY
[0004] To overcome these shortcomings, the present disclosure describes methods and systems for preventing actions by suspicious objects that are capable of nullifying digital forensics investigations.
[0005] In one example, the method can identify a suspicious object from a plurality of objects on a computer device and monitor actions performed by the suspicious object, where the actions include commands and requests originating from the suspicious object. The method can intercept a first command from the suspicious object, aimed at creating and/or modifying a digital artifact on the computer device, and, after the interception of the first command, intercept a second command from the suspicious object, aimed at eliminating at least one of the suspicious object and the digital artifact. In response to the interception of both the first command aimed at creating and/or modifying the digital artifact and the second command aimed at eliminating at least one of the suspicious object and the digital artifact, the method can block the second command and can save the suspicious object and the digital artifact in a digital archive.
[0006] In some examples, the method can save the contents of the digital archive with a backup of the system and user data on the computer device.
[0007] In some examples, the method can save the respective positions of the suspicious object and the digital artifact in the digital archive.
[0008] In some examples, the method can save a record of all monitored actions of the suspicious object in the digital archive.
[0009] In some examples, the method can identify the suspicious object from the plurality of objects on the computer device by: extracting, for each respective object of the plurality of objects, a digital signature of the respective object; determining whether the digital signature of the respective object matches any trusted digital signature in a list of allowed digital signatures; and, in response to determining that no match exists, identifying the respective object as the suspicious object.
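By way of illustration, the allowlist check described above can be expressed as a short routine. The following is a minimal, hedged sketch in Python: the signature is approximated by a SHA-256 digest of the file contents and the allowlist is a hard-coded set, whereas an actual object identifier 112 would read the object's embedded code-signing certificate and consult a maintained catalog of trusted signatures.

```python
import hashlib

# Assumed allowlist of trusted signatures (illustrative digest only).
ALLOWED_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def extract_signature(path: str) -> str:
    """Stand-in for signature extraction: a hash of the file contents.

    A production implementation would instead parse the object's embedded
    code-signing (e.g. Authenticode) certificate.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_suspicious(path: str) -> bool:
    """An object is suspicious if its signature is not on the allowlist."""
    return extract_signature(path) not in ALLOWED_SIGNATURES
```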
[0010] In some examples, where a plurality of suspicious objects is identified, the method can monitor the plurality of suspicious objects for a threshold time period, wherein the plurality of suspicious objects includes the suspicious object. The method can also identify a subset of suspicious objects that have not performed, during the threshold time period, actions that degrade the performance of the computing device or compromise the user's privacy on the computing device. The method can determine that the subset of suspicious objects does not contain suspicious objects and can stop monitoring the subset.
[0011] In some examples, the method may detect that the digital artifact has created and/or modified another digital artifact on the computing device. In response to the interception of a third command, from the digital artifact or the other digital artifact, aimed at deleting the suspicious object, the method can block the third command and save the suspicious object, the digital artifact and the other digital artifact in the digital archive.
[0012] It should be noted that the methods described above can be implemented in a system comprising a hardware processor. Alternatively, the methods can be implemented using machine-executable instructions stored on a non-transitory machine-readable medium.
[0013] The simplified summary of exemplary embodiments given above serves to provide a basic understanding of the present disclosure. This summary is not an exhaustive overview of all contemplated embodiments and is not intended to identify key or critical elements of all embodiments, nor to delineate the scope of any or all embodiments of the present disclosure. Its sole purpose is to present one or more embodiments in simplified form as a prelude to the more detailed description of the disclosure that follows. To this end, the one or more embodiments of the present disclosure include the features described and exemplified in the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] The accompanying drawings, which are incorporated into and constitute an integral part of this specification, illustrate one or more exemplary aspects of the present disclosure and, together with the detailed description, serve to explain their principles and implementations.
[0015] FIG. 1 is a block diagram illustrating an example of a system for countering the removal of digital forensics information by malicious software.
[0016] FIG. 2 is a block diagram illustrating an example of a method for monitoring activity by suspicious objects.
[0017] FIG. 3 illustrates a flowchart of an exemplary method for blocking an attempt to eliminate a suspicious object and/or its artifact.
[0018] FIG. 4 presents an example of a general purpose computer system, on which embodiments of the present disclosure can be implemented.
DETAILED DESCRIPTION
[0019] Exemplary aspects are described herein in the context of a system, method and computer program product for generating and saving specific forensic metadata. Those skilled in the art will realize that the following description is illustrative only and is not intended to be limiting in any way. Other aspects will readily suggest themselves to those skilled in the art having the benefit of this disclosure. Reference will now be made in detail to implementations of the exemplary aspects as illustrated in the accompanying drawings. The same reference indicators will be used to the extent possible throughout the drawings and the following description to refer to the same or similar elements.
[0020] FIG. 1 is a block diagram illustrating a system 100 for preventing anti-forensic actions by suspicious objects. The system 100 includes the computer device 102, which may be a personal computer, server, etc., and which comprises a central processing unit ("CPU") and a memory holding the software required to perform various tasks (e.g. operating system (OS) software, application software, etc.). The data for the computer device 102 can be stored in the memory of the device itself, as well as on other external devices such as backup servers 104, compact discs, flash drives, optical discs and the like.
[0021] In the present disclosure, the backup data 106 from the memory of the computer device 102 is transmitted to a backup server 104 via the network 108. The network 108 may be the Internet, a mobile phone network, a data network (e.g. a 4G or LTE network), Bluetooth, or any combination thereof. For example, the backup server 104 can be part of a cloud computing environment reached via the Internet, or it can be part of a local area network (LAN) together with the computer device 102. The lines connecting the backup server 104 and the computer device 102 to the network 108 represent communication paths that can include any combination of free-space connections (e.g. for wireless signals) and physical connections (e.g. fiber-optic cables).
[0022] It should be noted that there may be more than one backup server 104, but only one is illustrated in FIG. 1 to avoid overcomplicating the drawing. For example, the backup server 104 can represent a plurality of servers in a distributed cloud cluster. The backup server 104 may comprise any number of physical components (e.g., as illustrated in FIG. 4). For example, the backup server 104 may comprise computers, physical block storage devices (e.g. hard disk drives (HDDs), solid state drives (SSDs), flash drives, SMR disks, etc.), memory (e.g. random access memory (RAM)), I/O interface components, etc.
[0023] The backup data 106 can be any type of data, including user data, applications, system files, preferences, documents, media, etc. The computer device 102 can send the backup data 106 for storage in the backup server 104 in accordance with a backup schedule indicating the specific data to be included in the backup data 106 and how often the data should be backed up. For example, the computer device 102 may generate a copy of a data file existing in the memory of the computer device 102 and transmit the copy as backup data 106 to the backup server 104 every two hours. The backup data 106 can be selected by a user of the computer device 102, and the frequency of the backup schedule can also be selected by the user.
[0024] The anti-forensic prevention module 110 may be part of intrusion detection system (IDS) software. The anti-forensic prevention module 110, as illustrated in FIG. 1, is client-side software, but it can also be split into a thick client and a thin client distributed between the backup server 104 and the computer device 102, respectively. In some examples, the anti-forensic prevention module 110 can be divided into at least three components: object identifier 112, activity analyzer 114 and digital archive 116. The object identifier 112 can be configured to identify suspicious objects from the plurality of objects on the computer device 102. An object can be any application, process, thread, file, data structure or PE-type executable file on the computer device 102. The object identifier 112 can retrieve a list of all objects and identify suspicious objects in the list. For example, the object identifier 112 can retrieve a list of all objects on the computer device 102 (e.g. by enumerating processes and threads, scanning files and applications, etc.) in an initial scan. In some examples, following the initial scan, when a new object is created on the computer device 102 or an existing object is modified, the object identifier 112 can evaluate/re-evaluate the suspicious character of the new/modified object. In some examples, to evaluate the suspicious character, the object identifier 112 can compare the retrieved list of objects with an allowed list of objects that are trusted. Objects that are not in the allowed list are treated as suspicious by the object identifier 112.
[0025] Among the objects that can be observed due to their suspicious nature are executable files and various types of dynamic libraries that can be injected into trusted processes (e.g. those present in the allowed list) by third-party malicious code, programs, scripts, etc. The object identifier 112 can also determine whether an object has a trusted digital signature in order to assess its suspicious character. The lack of a trusted digital signature can indicate to the object identifier 112 that the object in question is a suspicious object. The object identifier 112 can analyze the suspicious behavior of an object (e.g. injection of dynamic libraries into other processes deemed trustworthy), as well as its network activity (e.g. unusual download/upload items and destinations).
[0026] The activity analyzer 114 can be configured to monitor actions performed by the suspicious object. Consider an example where the suspicious object is an executable file that generates another file. In this scenario, the action is the generation of the other file. The other file is a digital artifact of the suspicious object. A digital artifact (also referred to simply as an artifact) can be any application, process, thread, file or data structure that is created/modified, directly or indirectly, by a suspicious object. In this example, because the other file is created by the action taken by the suspicious object, the other file is identified as a digital artifact by the activity analyzer 114. In another example, if the executable file has modified an existing file on the computer device 102, the activity analyzer 114 can identify the modified file as a digital artifact. In both examples, the suspicious object is directly creating/modifying a digital artifact. In the case of indirect creation/modification, the suspicious object may inject code into a trusted object (making it a digital artifact), and the object with the injected code may then modify/create an additional object, which is also considered a digital artifact. Although the suspicious object did not create/modify the additional object directly, because it was created/modified by an object that the suspicious object directly affected, the additional object is considered a digital artifact. Therefore, an indirectly affected digital artifact can be any digital artifact affected by a consequent action of the suspicious object, even if the suspicious object did not specifically target that digital artifact. Thus, in some examples, the activity analyzer 114 may also monitor actions involving the suspicious object (but not directly performed by the suspicious object).
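The direct/indirect artifact attribution described above behaves like a simple taint-propagation scheme. The following Python fragment is a minimal sketch of that idea only; the class and method names are illustrative assumptions and are not part of the disclosed activity analyzer 114.

```python
class ArtifactTracker:
    """Tracks artifacts created or modified, directly or indirectly,
    by a suspicious object (a simple taint-propagation sketch)."""

    def __init__(self, suspicious_object: str):
        self.suspicious_object = suspicious_object
        # Objects whose actions are attributed back to the suspicious object.
        self.tainted = {suspicious_object}
        # Everything the tainted set has created or modified so far.
        self.artifacts = set()

    def on_create_or_modify(self, actor: str, target: str) -> None:
        # Only actions by the suspicious object or one of its artifacts matter.
        if actor in self.tainted:
            self.artifacts.add(target)
            # The target may itself create further artifacts (indirect
            # creation/modification), so it becomes tainted as well.
            self.tainted.add(target)

tracker = ArtifactTracker("suspicious.exe")
tracker.on_create_or_modify("suspicious.exe", "host_process.dll")    # direct
tracker.on_create_or_modify("host_process.dll", "dropped_data.db")   # indirect
```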
[0027] The activity analyzer 114 can specifically detect creation or modification actions by intercepting commands from suspicious objects. Once the activity analyzer 114 has identified the suspicious object and the digital artifact, the activity analyzer 114 determines whether any subsequent command from the suspicious object or from a digital artifact would result in the deletion of the suspicious object itself or of the created/modified digital artifact. Because the object has already been identified as suspicious and the command involves deleting the suspicious object (self-destruction) or at least one digital artifact of the suspicious object, the activity analyzer 114 can mark the suspicious object as malware that is attempting to remove traces of itself. The anti-forensic prevention module 110 can therefore block the delete command.
[0028] Consider an example of a malware attack targeting the computer device 102. The malware may have characteristics similar to "Flame", a modular computer malware discovered in 2012. Its features may include the ability to record audio, take screenshots, monitor keyboard activity, track network activity, etc. The malware can use a variety of encryption methods and can save the captured, structured information in an SQLite database. Just as "Flame" identifies anti-virus software installed on a target device and adapts its behavior to circumvent the anti-virus software's security (e.g. by changing file name extensions), the malware can attempt to deceive security systems on the computer device 102. In addition, the malware can have a "terminate" function, analogous to that of "Flame", which eliminates all traces of its files and operations from the system.
[0029] Since antivirus software may not have a definition for such malware (as it may be new), it will be ineffective at detection. "Flame" is a program of approximately 20 MB in size. When the object identifier 112 performs a scan of objects, the malware program will be detected as an object. In some examples, since the object is not in the list of allowed objects, it will be treated as a suspicious object by the anti-forensic prevention module 110. Captured screenshots, audio recordings and other information collected by the malware are identified as digital artifacts by the activity analyzer 114. Furthermore, the malware's attempt to "terminate" and delete the captured copies, so that the malware cannot be traced on the computer device 102, is treated as a delete command. In response to the interception of the terminate command, the anti-forensic prevention module 110 can block the command from running, since the malware is considered a suspicious object.
[0030] It should be noted that several objects may be characterized as suspicious objects even if they are ultimately harmless. In some examples, to reduce the amount of resources devoted to monitoring a plurality of suspicious objects that are in fact harmless, the object identifier 112 may assume that a previously identified suspicious object is no longer a suspicious object after a threshold time period has elapsed during which the object did not interact with other objects on the computer device 102, did not perform harmful actions that degrade the performance of the computer device 102 (e.g. using more than a threshold amount of CPU, RAM, storage, etc. for longer than a predetermined time), and did not compromise the user's privacy on the computer device 102 (e.g. by monitoring, saving and sending data elsewhere).
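A compact, hedged Python sketch of this pruning policy is given below. The record layout, field names and the one-day threshold are assumptions made for illustration; the disclosed system does not prescribe them.

```python
import time

THRESHOLD_SECONDS = 24 * 60 * 60  # assumed threshold time period (one day)

def prune_harmless(suspicious: dict) -> dict:
    """Stop monitoring objects that stayed quiet for the whole threshold period.

    `suspicious` maps an object name to a record such as:
        {"first_seen": <epoch seconds>, "interacted": bool,
         "degraded_performance": bool, "compromised_privacy": bool}
    """
    now = time.time()
    still_monitored = {}
    for name, rec in suspicious.items():
        quiet = not (rec["interacted"]
                     or rec["degraded_performance"]
                     or rec["compromised_privacy"])
        if quiet and now - rec["first_seen"] >= THRESHOLD_SECONDS:
            continue  # assumed harmless; drop it from further monitoring
        still_monitored[name] = rec
    return still_monitored
```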
[0031] The anti-forensic prevention module 110 can furthermore store the suspicious object and the digital artifact or artifacts (together forming the anti-forensic data 118) in the digital archive 116. The digital archive 116 can be configured as an isolated archive (e.g. a quarantine) that prevents the suspicious object from carrying out further commands. In some examples, the anti-forensic prevention module 110 can generate the anti-forensic data 118. The anti-forensic data 118 is backup data that includes the contents of the digital archive 116 (e.g. the suspicious object and the digital artifact) as well as information such as (1) the respective locations of the suspicious object and the digital artifact in the memory of the computer device 102 and (2) all monitored actions of the suspicious object as traced by the activity analyzer 114. In some examples, the information listed above is stored in the digital archive 116 along with the suspicious object and the digital artifact.
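One possible in-memory layout of the anti-forensic data 118 is sketched below; the field names and types are purely illustrative assumptions, not a structure defined by the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AntiForensicData:
    """Illustrative container for the anti-forensic data 118."""
    suspicious_object: bytes                 # raw bytes of the suspicious object
    artifacts: Dict[str, bytes]              # artifact path -> raw contents
    original_locations: Dict[str, str]       # object/artifact -> original location
    action_log: List[str] = field(default_factory=list)  # monitored actions, in order
```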
[0032] In some examples, the anti-forensic prevention module 110 can further encrypt the suspicious object and the digital artifact (e.g. using algorithms such as the Advanced Encryption Standard (AES), Rivest-Shamir-Adleman (RSA), etc.) before placing them in the digital archive 116. This prevents a third-party application from accessing the suspicious object or the digital artifact in any way and removing the traces. In an example where the object and artifacts are encrypted with a public key, the private key may be stored on a device other than the computer device 102 to enhance security.
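As a hedged illustration of encrypting archive contents, the snippet below uses the Fernet recipe from the Python cryptography package (AES-based authenticated symmetric encryption); keeping the generated key off the protected machine plays the role that the externally stored private key plays in the public-key example above. This is a sketch of one possible approach, not the module's actual implementation.

```python
# pip install cryptography
from cryptography.fernet import Fernet

def archive_encrypted(payload: bytes):
    """Encrypt an object or artifact before it is placed in the digital archive."""
    key = Fernet.generate_key()           # to be stored off-device for safety
    token = Fernet(key).encrypt(payload)  # ciphertext safe to keep in the archive
    return key, token

key, ciphertext = archive_encrypted(b"raw bytes of suspicious.exe")
```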
[0033] In some examples, the anti-forensic data 118 is saved together with the backup data 106 on the backup server 104. For example, when the backup data 106 is uploaded periodically, the anti-forensic data 118 is also backed up. This allows a forensic technician conducting a forensic investigation to recreate the state of the computer device 102 and to analyze the effects of the suspicious object and the digital artifact in view of their respective locations in the memory of the computer device 102 (before being quarantined in the digital archive 116). The state of the computer device 102 can represent a fully restored image of the volume of the computer device 102 together with metadata related to the cybercrime or virus attack, including the time of the event, the sources of the malicious actions, and the reproduction of the corrupted or infected data.
[0034] FIG. 2 is a block diagram illustrating a method 200 for monitoring activity by suspicious objects. The malware source 202 may be the origin of the suspicious .exe file 204. Suppose that the object identifier 112 has already identified the suspicious .exe file 204 as a suspicious object. The suspicious .exe file 204 can create and/or modify a plurality of artifacts (e.g. artifacts 1, 2, ... N). The activity analyzer 114 can intercept attempts to create/modify, and subsequently delete, any of the artifacts or the suspicious .exe file 204.
[0035] The activity analyzer 114 may contain two filters, namely the file system filter 206 and the registry filter 208. In some examples, the file system filter 206 scans for new artifacts in the memory of the computer device 102 and detects the removal of artifacts from memory. For example, when a file is created using the New Technology File System (NTFS), a record related to the file is added to the Master File Table (MFT). The MFT is a database that stores information about all files and directories on an NTFS volume. The file system filter 206 can monitor changes in the MFT to detect the addition and removal of digital artifacts.
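For illustration only, a user-mode approximation of such file monitoring can be built with the third-party watchdog package for Python, as sketched below; the actual file system filter 206 would typically run as a kernel-mode filter driver observing MFT changes, which is not shown here.

```python
# pip install watchdog -- a user-mode stand-in for the kernel-level filter 206
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class ArtifactWatcher(FileSystemEventHandler):
    """Reports file creations and deletions so they can be checked against
    the set of tracked artifacts (e.g. the artifact tracker sketched earlier)."""

    def on_created(self, event):
        print(f"possible new artifact: {event.src_path}")

    def on_deleted(self, event):
        print(f"possible artifact removal: {event.src_path}")

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(ArtifactWatcher(), path=".", recursive=True)
    observer.start()
    try:
        observer.join()          # run until interrupted
    except KeyboardInterrupt:
        observer.stop()
```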
[0036] The registry filter 208 can be configured to monitor registry changes on the computer device 102. Before an application can add data to the system registry, the application must create or open a key. On a Windows operating system, the application can use functions such as RegOpenKeyEx or RegCreateKeyEx to perform this task. The registry filter 208 can monitor calls to these functions when they are performed by a suspicious object. The suspicious object can then use the RegSetValueEx function to bind a value and its data to the opened/created key. When intercepting commands, the activity analyzer 114 can watch (via the registry filter 208) for this combination of functions. As illustrated in FIG. 2, each artifact has a file and a registry value that the activity analyzer 114 can track. Following the creation/modification of a file identified as a digital artifact, the activity analyzer 114 monitors attempts to delete that file or the suspicious .exe file 204. For example, the registry filter 208 can watch for an attempt to execute the RegDeleteKey or RegDeleteValue function. The first of these functions is used to delete a key from the registry, and the second is used to delete a value from a key. To thwart an attempt to hide the traces of its actions, the activity analyzer 114 can prevent the execution of these functions.
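The decision logic of the registry filter 208 can be sketched in a few lines of Python over a hypothetical stream of already-intercepted registry operations. The function names mirror the Win32 APIs mentioned above, but the interception itself (normally done via a kernel-mode registry callback) is assumed and not shown; the routine below is illustrative only.

```python
suspicious_keys = set()   # registry keys touched by suspicious objects

def on_registry_op(actor_is_suspicious: bool, function: str, key: str) -> bool:
    """Return True if the intercepted registry operation should be blocked."""
    if actor_is_suspicious and function in ("RegCreateKeyEx", "RegOpenKeyEx",
                                            "RegSetValueEx"):
        suspicious_keys.add(key)   # remember the artifact's registry trace
        return False               # creation/modification is let through
    if function in ("RegDeleteKey", "RegDeleteValue") and key in suspicious_keys:
        return True                # block the attempt to erase the trace
    return False
```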
[0037] The activity analyzer 114 can further quarantine the suspicious .exe file 204 and the artifacts 1, 2, ... N in the digital archive 116. The digital archive 116 may comprise the archive filter 210, which is intended to provide an efficient way to store the collected objects and artifacts, including volume state information (e.g. a volume map) used to understand where the objects and artifacts were initially located (e.g. physical addresses on a volume of the computer device 102).
[0038] FIG. 3 illustrates a flowchart of the method 300 for blocking an attempt to eliminate a suspicious object and/or its artifact. At operation 302, the object identifier 112 identifies a suspicious object from a plurality of objects on the computer device 102. For example, for each respective object of the plurality of objects, the object identifier 112 can extract a digital signature of the respective object and can determine whether the digital signature of the respective object matches any trusted digital signature in a list of allowed digital signatures. In response to determining that there is no match, the object identifier 112 can identify the respective object as a suspicious object.
[0039] At operation 304, the activity analyzer 114 monitors the actions performed by the suspicious object (e.g. using the file system filter 206 and the registry filter 208). At operation 306, the activity analyzer 114 intercepts a first command from the suspicious object to create and/or modify a digital artifact on the computer device 102.
[0040] At operation 308, the activity analyzer 114 intercepts a second command from the suspicious object. At operation 310, the activity analyzer 114 determines whether the second command is to delete the created/modified digital artifact. In response to determining that the second command is not to delete the created/modified digital artifact, the method 300 proceeds to operation 312, where the activity analyzer 114 determines whether the second command is to delete the suspicious object itself.
[0041] In response to determining that the second command is not to delete the suspicious object either, the method 300 returns to operation 304, where the activity analyzer 114 continues to monitor actions performed by the suspicious object. If, however, at operation 310 the activity analyzer 114 determines that the second command is to delete the created/modified digital artifact, or if at operation 312 the activity analyzer 114 determines that the second command is to delete the suspicious object, the method 300 proceeds to operation 314, where the anti-forensic prevention module 110 blocks the second command. At operation 316, the anti-forensic prevention module 110 stores the suspicious object and the digital artifact in the digital archive 116.
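Operations 308-316 reduce to a short decision routine, sketched below under the assumption of a hypothetical `command` object exposing `is_delete()`, `target` and `block()`, and an analyzer exposing the suspicious object, its artifact set and an archive; none of these names come from the disclosure itself.

```python
def handle_second_command(analyzer, command) -> None:
    """Illustrative sketch of operations 308-316 of method 300."""
    deletes_artifact = command.is_delete() and command.target in analyzer.artifacts
    deletes_object = command.is_delete() and command.target == analyzer.suspicious_object

    if not (deletes_artifact or deletes_object):
        return                    # operations 310/312: keep monitoring (back to 304)

    command.block()               # operation 314: block the second command
    analyzer.archive.save(analyzer.suspicious_object,   # operation 316: preserve
                          analyzer.artifacts)           # the forensic evidence
```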
[0042] FIG. 4 is a block diagram illustrating a computer system 20 on which embodiments of systems and methods for anti-forensic prevention can be implemented. The computer system 20 may represent the computer device 102 and/or the backup server 104 and may take the form of multiple computing devices or of a single computing device, for example a desktop computer, a notebook, a laptop, a mobile computing device, a smartphone, a tablet, a server, a mainframe, an embedded device, or another form of computing device.
[0043] As illustrated, the computer system 20 includes a central processing unit (CPU) 21, a system memory 22 and a system bus 23 that connects the various components of the system, including the memory associated with the central processing unit 21. The system bus 23 can comprise a bus memory or a bus memory controller, a peripheral bus and a local bus capable of interacting with any other bus architecture. Examples of buses may include PCI, ISA, PCI-Express, HyperTransport™, InfiniBand™, Serial ATA, I²C, and other suitable interconnects. The central processing unit 21 (also called processor) can include one or more sets of processors with single or multiple cores. The processor 21 may execute one or more sets of computer-executable code implementing the techniques of the present disclosure. For example, any of the methods 200-300 performed by the anti-forensic prevention module 110 (e.g. through its respective components, such as the object identifier 112) can be executed by the processor 21. The system memory 22 can be any memory for storing data used herein and/or computer programs that are executable by the processor 21. The system memory 22 may include volatile memory, such as random access memory (RAM) 25, and non-volatile memory, such as read-only memory (ROM) 24, flash memory, etc., or any combination thereof. The basic input/output system (BIOS) 26 can store the basic procedures for transferring information between elements of the computer system 20, such as those used when loading the operating system from ROM 24.
[0044] The computer system 20 may include one or more storage devices, such as one or more removable storage devices 27, one or more non-removable storage devices 28, or a combination thereof. The one or more removable storage devices 27 and non-removable storage devices 28 are connected to the system bus 23 via a storage interface 32. In one example, the storage devices and the corresponding computer-readable storage media are power-independent modules for the storage of computer instructions, data structures, program modules and other data of the computer system 20. The system memory 22, the removable storage devices 27 and the non-removable storage devices 28 can use a variety of computer-readable storage media. Examples of computer-readable storage media include machine memory, such as cache, SRAM, DRAM, zero-capacitor RAM, twin-transistor RAM, eDRAM, EDO RAM, DDR RAM, EEPROM, NRAM, RRAM, SONOS, PRAM; flash memory or other storage technology, such as solid state drives (SSDs) or flash drives; magnetic cassettes, magnetic tape and magnetic disk storage, such as hard disk drives or floppy disks; optical storage, such as compact discs (CD-ROMs) or digital versatile discs (DVDs); and any other medium that can be used to store the desired data and that can be accessed by the computer system 20.
[0045] The system memory 22, the removable storage devices 27 and the non-removable storage devices 28 of the computer system 20 can be used to store an operating system 35, additional program applications 37, other program modules 38 and program data 39. The computer system 20 may include a peripheral interface 46 for communicating data from input devices 40, such as a keyboard, mouse, stylus, game controller, voice input device, touch input device, or other peripheral devices, such as a printer or scanner, via one or more I/O ports, such as a serial port, a parallel port, a universal serial bus (USB), or another peripheral interface. A display device 47, such as one or more monitors, projectors or integrated displays, can be connected to the system bus 23 through an output interface 48, such as a video adapter. In addition to the display devices 47, the computer system 20 can be equipped with other peripheral output devices (not shown), such as speakers and other audiovisual devices.
[0046] The computer system 20 can operate in a network environment, using a network connection to one or more remote computers 49. The remote computer(s) 49 can be local computer workstations or servers comprising most or all of the elements mentioned above in the description of the nature of the computer system 20. Other devices may also be present in the computer network, such as, but not limited to, routers, network stations, peer devices or other network nodes. The computer system 20 may include one or more network interfaces 51 or network adapters for communicating with the remote computers 49 via one or more networks, such as a local-area computer network (LAN) 50, a wide-area computer network (WAN), an intranet and the Internet. Examples of the network interface 51 may include an Ethernet interface, a Frame Relay interface, a SONET interface and wireless interfaces.
[0047] Embodiments of the present disclosure may be a system, a method and/or a computer program product. The computer program product may include a computer-readable storage medium (or media) having computer-readable program instructions thereon for causing a processor to carry out aspects of this disclosure.
[0048] The computer-readable storage medium may be a tangible device that can retain and store program code in the form of instructions or data structures that can be accessed by a processor of a computing device, such as the computer system 20. The computer-readable storage medium may be an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination thereof. By way of example, such a computer-readable storage medium may comprise a random access memory (RAM), a read-only memory (ROM), an EEPROM, a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a flash memory, a hard disk, a portable diskette, a memory stick, or even a mechanically encoded device such as punch cards or raised structures in a groove on which instructions are recorded. As used herein, the computer-readable storage medium is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media, or electrical signals transmitted via cable.
[0049] The computer-readable program instructions described herein can be downloaded to the respective computing devices from a computer-readable storage medium, or from an external computer or external storage device via a network, for example the Internet, a local area network, a wide area network and/or a wireless network. The network may include copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network interface in each computing device receives the computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium within the respective computing device.
[0050] The computer-readable program instructions for carrying out operations of the present disclosure may be assembly instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language and conventional procedural programming languages. The computer-readable program instructions can execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer can be connected to the user's computer through any type of network, including a LAN or WAN, or the connection can be made to an external computer (for example, through the Internet). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGAs) or programmable logic arrays (PLAs) may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to carry out aspects of the present disclosure.
[0051] In various aspects, the systems and methods described in the present disclosure can be addressed in terms of modules. The term "module" as used herein refers to a real-world device, component, or arrangement of components implemented in hardware, such as an application-specific integrated circuit (ASIC) or FPGA, for example, or as a combination of hardware and software, for example by a microprocessor system and a set of instructions that implement the module's functionality and that (while being executed) transform the microprocessor system into a special-purpose device. A module can also be implemented as a combination of the two, with certain functions facilitated by hardware alone and other functions facilitated by a combination of hardware and software. In certain implementations, at least a portion, and in some cases all, of a module can be executed on the processor of a computer system. Accordingly, each module can be realized in a variety of suitable configurations and should not be limited to any particular implementation exemplified herein.
[0052] In the interest of clarity, not all of the routine features of the aspects are disclosed herein. It will be appreciated that, in the development of any actual implementation of the present disclosure, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, and these specific goals will vary for different implementations and different developers. It is understood that such a development effort might be complex and time-consuming, but it would nevertheless be a routine engineering undertaking for those of ordinary skill in the art having the benefit of this disclosure.
[0053] Furthermore, it is understood that the phraseology or terminology used in this context is purely descriptive and non-limiting, so that the terminology or phraseology referred to in these specifications must be interpreted by experts in the field in the light of the teachings and guidelines presented here, in combination with the knowledge of experts in the respective field or fields. In addition, any terms in the specifications or claims are not to be construed as attributable to an unusual or special meaning, unless explicitly indicated as such.
[0054] The various aspects disclosed herein encompass present and future known equivalents of the known modules referred to herein by way of illustration. Moreover, while aspects and applications have been shown and described, it will be apparent to those skilled in the art having the benefit of this disclosure that many more modifications than those mentioned above are possible without departing from the inventive concepts disclosed herein.
Claims:
Claims (14)
[1]
1. A method of preventing anti-forensic actions, the method comprising:
identifying a suspicious object from a plurality of objects on a computer device;
monitoring actions performed by the suspicious object, wherein the actions include commands and requests originating from the suspicious object;
intercepting a first command from the suspicious object aimed at creating and/or modifying a digital artifact on the computer device;
following the interception of the first command, intercepting a second command from the suspicious object aimed at eliminating at least one of the suspicious object and the digital artifact;
in response to the interception of both the first command aimed at creating and/or modifying the digital artifact and the second command aimed at eliminating at least one of the suspicious object and the digital artifact:
blocking the second command; and
saving the suspicious object and the digital artifact in a digital archive.
[2]
The method according to claim 1, further comprising:
saving the contents of the digital archive with a backup of the system and user data on the computer device.
[3]
The method according to claim 1, further comprising:
saving respective positions of the suspicious object and the digital artifact in the digital archive.
[4]
The method according to claim 1, further comprising:
saving a record of all monitored actions of the suspicious object in the digital archive.
[5]
The method according to claim 1, wherein identifying the suspicious object from the plurality of objects on the computer device comprises:
for each respective object of the plurality of objects:
extracting a digital signature of the respective object;
determining whether the digital signature of the respective object matches any trusted digital signature in a list of allowed digital signatures; and
in response to determining that no match exists, identifying the respective object as the suspicious object.
[6]
The method of claim 1, wherein a plurality of suspicious objects is identified, further comprising:
monitoring the plurality of suspicious objects for a threshold time period, wherein the plurality of suspicious objects includes the suspicious object;
identifying a subset of suspicious objects that have not performed, during the threshold time period, actions that degrade the performance of the computing device or compromise the user's privacy on the computing device;
determining that the subset of suspicious objects does not contain suspicious objects; and
stopping monitoring of the subset.
[7]
The method according to claim 1, further comprising:
detecting that the digital artifact has created and/or modified another digital artifact on the computing device;
in response to the interception of a third command, from the digital artifact or the other digital artifact, aimed at deleting the suspicious object:
blocking the third command; and
saving the suspicious object, the digital artifact and the other digital artifact in the digital archive.
[8]
8. A system for preventing anti-forensic actions, the system comprising:
a hardware processor configured to:
identify a suspicious object from a plurality of objects on a computer device;
monitor actions performed by the suspicious object, wherein the actions include commands and requests originating from the suspicious object;
intercept a first command from the suspicious object aimed at creating and/or modifying a digital artifact on the computer device;
following the interception of the first command, intercept a second command from the suspicious object aimed at eliminating at least one of the suspicious object and the digital artifact;
in response to the interception of both the first command aimed at creating and/or modifying the digital artifact and the second command aimed at eliminating at least one of the suspicious object and the digital artifact:
block the second command; and
save the suspicious object and the digital artifact in a digital archive.
[9]
The system according to claim 8, wherein the hardware processor is further configured to:
save the contents of the digital archive with a backup of the system and user data on the computer device.
[10]
The system according to claim 8, wherein the hardware processor is further configured to:
save respective positions of the suspicious object and the digital artifact in the digital archive.
[11]
The system according to claim 8, wherein the hardware processor is further configured to:
save a record of all monitored actions of the suspicious object in the digital archive.
[12]
The system according to claim 8, wherein the hardware processor is further configured to identify the suspicious object from the plurality of objects on the computer device by:
for each respective object of the plurality of objects:
extracting a digital signature of the respective object;
determining whether the digital signature of the respective object matches any trusted digital signature in a list of allowed digital signatures; and
in response to determining that no match exists, identifying the respective object as the suspicious object.
[13]
The system according to claim 8, wherein a plurality of suspicious objects is identified and wherein the hardware processor is further configured to:
monitor the plurality of suspicious objects for a threshold time period, wherein the plurality of suspicious objects includes the suspicious object;
identify a subset of suspicious objects that have not performed, during the threshold time period, actions that degrade the performance of the computing device or compromise the user's privacy on the computing device;
determine that the subset of suspicious objects does not contain suspicious objects; and
stop monitoring the subset.
[14]
The system according to claim 8, wherein the hardware processor is further configured to:
detect that the digital artifact has created and/or modified another digital artifact on the computing device;
in response to the interception of a third command, from the digital artifact or the other digital artifact, aimed at deleting the suspicious object:
block the third command; and
save the suspicious object, the digital artifact and the other digital artifact in the digital archive.
Similar technologies:
Publication number | Publication date | Patent title
US11244047B2|2022-02-08|Intelligent backup and versioning
US9256739B1|2016-02-09|Systems and methods for using event-correlation graphs to generate remediation procedures
US10509906B2|2019-12-17|Automated code lockdown to reduce attack surface for software
Case et al.2017|Memory forensics: The path forward
US9202057B2|2015-12-01|Systems and methods for identifying private keys that have been compromised
US9065849B1|2015-06-23|Systems and methods for determining trustworthiness of software programs
US9158915B1|2015-10-13|Systems and methods for analyzing zero-day attacks
JP6196393B2|2017-09-13|System and method for optimizing scanning of pre-installed applications
US8955138B1|2015-02-10|Systems and methods for reevaluating apparently benign behavior on computing devices
RU2667052C2|2018-09-13|Detection of harmful software with cross-review
US10176329B2|2019-01-08|Systems and methods for detecting unknown vulnerabilities in computing processes
US9489513B1|2016-11-08|Systems and methods for securing computing devices against imposter processes
US10262131B2|2019-04-16|Systems and methods for obtaining information about security threats on endpoint devices
US9894085B1|2018-02-13|Systems and methods for categorizing processes as malicious
US9519780B1|2016-12-13|Systems and methods for identifying malware
US9659176B1|2017-05-23|Systems and methods for generating repair scripts that facilitate remediation of malware side-effects
US9652615B1|2017-05-16|Systems and methods for analyzing suspected malware
US11263295B2|2022-03-01|Systems and methods for intrusion detection and prevention using software patching and honeypots
CH716699A2|2021-04-15|Systems and methods to counter the removal of digital forensic information by malicious software.
US10546125B1|2020-01-28|Systems and methods for detecting malware using static analysis
CN110659478A|2020-01-07|Method for detecting malicious files that prevent analysis in an isolated environment
US20210019409A1|2021-01-21|System and method for identifying system files to be checked for malware using a remote service
Huang et al.2019|Identifying HID-based attacks through process event graph using guilt-by-association analysis
US10572663B1|2020-02-25|Systems and methods for identifying malicious file droppers
JP2021111384A|2021-08-02|System and method for protecting against unauthorized memory dump modification
Patent family:
Publication number | Publication date
EP3800567A1|2021-04-07|
US20210097182A1|2021-04-01|
JP2021064358A|2021-04-22|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

TWI656453B|2016-11-22|2019-04-11|財團法人資訊工業策進會|Detection system and detection method|
Legal status:
Priority:
Application number | Filing date | Patent title
US201962908742P| true| 2019-10-01|2019-10-01|
US17/005,478|US20210097182A1|2019-10-01|2020-08-28|Systems and methods for countering removal of digital forensics information by malicious software|